Linear Coordinate-Descent Message Passing for Quadratic Optimization


Similar Articles

Min-Sum-Min Message-Passing for Quadratic Optimization

We study the minimization of a quadratic objective function in a distributed fashion. It is known that the min-sum algorithm can be applied to solve the minimization problem if the algorithm converges. We propose a min-sum-min message-passing algorithm which includes the min-sum algorithm as a special case. As the name suggests, the new algorithm involves two minimizations in each iteration as c...


Convergence of Min-Sum Message Passing for Quadratic Optimization

We establish the convergence of the min-sum message passing algorithm for minimization of a quadratic objective function given a convex decomposition. Our results also apply to the equivalent problem of the convergence of Gaussian belief propagation.


Message-passing algorithms for quadratic minimization

Gaussian belief propagation (GaBP) is an iterative algorithm for computing the mean (and variances) of a multivariate Gaussian distribution, or equivalently, the minimum of a multivariate positive definite quadratic function. Sufficient conditions, such as walk-summability, that guarantee the convergence and correctness of GaBP are known, but GaBP may fail to converge to the correct solution gi...
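The min-sum entries and this GaBP entry describe the same iteration from two angles: min-sum message passing on a quadratic objective is exactly Gaussian belief propagation. As a concrete illustration, here is a minimal sketch of synchronous GaBP for minimizing (1/2) x^T A x - b^T x (equivalently, solving A x = b), assuming the standard parameterization of each message by a precision and a precision-weighted mean; the function name, schedule, and toy system are illustrative rather than taken from any of these papers.

import numpy as np

def gabp(A, b, max_iters=200, tol=1e-10):
    # Synchronous Gaussian belief propagation for min (1/2) x^T A x - b^T x.
    # Message i -> j carries a precision P[i, j] and a precision-weighted
    # mean h[i, j]; a node's belief combines its local potential with all
    # incoming messages.
    n = len(b)
    nbrs = [[k for k in range(n) if k != i and A[i, k] != 0.0] for i in range(n)]
    P = np.zeros((n, n))
    h = np.zeros((n, n))
    x = np.zeros(n)
    for _ in range(max_iters):
        P_new = np.zeros((n, n))
        h_new = np.zeros((n, n))
        for i in range(n):
            for j in nbrs[i]:
                # Fold in every incoming message except the one from j.
                P_excl = A[i, i] + sum(P[k, i] for k in nbrs[i] if k != j)
                h_excl = b[i] + sum(h[k, i] for k in nbrs[i] if k != j)
                P_new[i, j] = -A[i, j] ** 2 / P_excl
                h_new[i, j] = -A[i, j] * h_excl / P_excl
        P, h = P_new, h_new
        x_new = np.array([(b[i] + sum(h[k, i] for k in nbrs[i]))
                          / (A[i, i] + sum(P[k, i] for k in nbrs[i]))
                          for i in range(n)])
        if np.max(np.abs(x_new - x)) < tol:
            return x_new
        x = x_new
    return x

# Chain-structured, diagonally dominant toy system; exact solution (1, 1, 1).
A = np.array([[3.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 3.0]])
b = np.array([4.0, 5.0, 4.0])
print(gabp(A, b))             # matches
print(np.linalg.solve(A, b))  # the direct solve

On tree-structured systems such as this chain, the messages converge to the exact minimizer; on loopy graphs, convergence holds under sufficient conditions such as walk-summability, which is precisely the gap the convergence papers above address.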


Rescaled Coordinate Descent Methods for Linear Programming

We propose two simple polynomial-time algorithms to find a positive solution to Ax = 0. Both algorithms iterate between coordinate descent steps similar to von Neumann’s algorithm, and rescaling steps. In both cases, either the updating step leads to a substantial decrease in the norm, or we can infer that the condition measure is small and rescale in order to improve the geometry. We also show...
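For intuition, here is a sketch of the von Neumann-style coordinate descent step that both proposed algorithms alternate with rescaling, assuming the columns of A are first normalized to the unit sphere; the rescaling step, which is the paper's main device for obtaining a polynomial bound, is omitted, and the function name and tolerance are illustrative.

import numpy as np

def von_neumann_cd(A, eps=1e-4, max_iters=100_000):
    # Von Neumann-style coordinate descent for A x = 0, x >= 0 (sketch).
    # With unit-norm columns, x stays on the simplex while y = A_hat x is
    # driven toward the origin; classically ||y_t|| <= 1/sqrt(t).
    norms = np.linalg.norm(A, axis=0)
    A_hat = A / norms
    n = A_hat.shape[1]
    x = np.full(n, 1.0 / n)              # start at the simplex barycenter
    y = A_hat @ x
    for _ in range(max_iters):
        if np.linalg.norm(y) <= eps:
            return x / norms             # undo column scaling: A @ result ~ 0
        j = int(np.argmin(A_hat.T @ y))  # column with the most negative overlap
        a = A_hat[:, j]
        if a @ y > 0:                    # y separates 0 from conv{columns}:
            return None                  # no nonzero x >= 0 with A x = 0 exists
        d = y - a                        # nonzero here, since a @ y <= 0 < a @ a
        lam = np.clip((a @ (a - y)) / (d @ d), 0.0, 1.0)
        y = lam * y + (1.0 - lam) * a    # exact line search toward column j
        x *= lam
        x[j] += 1.0 - lam
    return x / norms

# Usage: a system whose nonnegative kernel is spanned by (1, 1, 1).
A = np.array([[1.0, 0.0, -1.0],
              [0.0, 1.0, -1.0]])
x = von_neumann_cd(A)
print(x, A @ x)  # x >= 0 with A @ x close to 0

Each iteration either makes substantial progress on ||y|| via the line search, or certifies separation; the paper's contribution lies in detecting when progress stalls because the condition measure is small, and rescaling so that the geometry improves.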


Stochastic Coordinate Descent for Nonsmooth Convex Optimization

Stochastic coordinate descent, due to its practicality and efficiency, is increasingly popular in the machine learning and signal processing communities, as it has proven successful in several large-scale optimization problems such as l1-regularized regression and support vector machines. In this paper, we consider a composite problem where the nonsmoothness has a general structure that...
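As a concrete instance of the composite setting described above, the following sketches randomized coordinate descent on l1-regularized least squares, min_x (1/2) ||A x - b||^2 + lam * ||x||_1, where the coordinate-wise proximal step reduces to soft-thresholding; the paper's more general nonsmooth structure is not reproduced here, and all names and constants are illustrative.

import numpy as np

def soft_threshold(z, t):
    # Proximal map of t * |.| (coordinate-wise shrinkage).
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def stochastic_cd_lasso(A, b, lam, epochs=50, seed=0):
    # Uniformly sampled coordinate descent for
    # min_x (1/2) ||A x - b||^2 + lam * ||x||_1.
    rng = np.random.default_rng(seed)
    m, n = A.shape
    x = np.zeros(n)
    r = A @ x - b                      # residual, maintained incrementally
    L = (A ** 2).sum(axis=0)           # coordinate-wise Lipschitz constants
    for _ in range(epochs * n):
        j = rng.integers(n)            # sample one coordinate uniformly
        g = A[:, j] @ r                # partial gradient of the smooth part
        x_j = soft_threshold(x[j] - g / L[j], lam / L[j])
        r += A[:, j] * (x_j - x[j])    # O(m) residual update per step
        x[j] = x_j
    return x

# Usage: recover a sparse vector from noisy random measurements.
rng = np.random.default_rng(1)
A = rng.standard_normal((100, 20))
x_true = np.zeros(20)
x_true[:3] = [2.0, -1.5, 1.0]
b = A @ x_true + 0.01 * rng.standard_normal(100)
print(np.round(stochastic_cd_lasso(A, b, lam=0.5), 2))

Maintaining the residual r = A x - b in place keeps each coordinate update at O(m) cost, which is what makes the method attractive at the scales the abstract mentions.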



Journal

Journal title: Neural Computation

Year: 2012

ISSN: 0899-7667, 1530-888X

DOI: 10.1162/neco_a_00368